feat: add @libre.graph.motionPhoto facet to driveItem#36
Conversation
Added properties for Motion Photo metadata including version and presentation timestamp.
Pull request overview
This PR extends the photo facet schema in the OpenAPI spec to describe Motion Photo metadata using the existing @libre.graph.* extension pattern.
Changes:
- Add `@libre.graph.motionPhoto` (int32) to indicate whether a file is a Motion Photo.
- Add `@libre.graph.motionPhotoVersion` (int32) and `@libre.graph.motionPhotoPresentationTimestampUs` (int64) as read-only Motion Photo metadata fields.
- Document the semantics and the mapping to Google's Motion Photo XMP fields in the schema descriptions.
I'm not sure about the prefix. How strict are we about adding those `@libre.graph` prefixes? If you insist, it's doable, of course.
Added motionPhotoVideoSize property to the OpenAPI spec.
I've added
butonic left a comment:
Can you model this as a `motionPhoto` facet, like the `photo` facet?
If a drive item has a `motionPhoto` facet (property), it is a Motion Photo, i.e. `@libre.graph.motionPhoto = true`.
All other properties become shorter:
```json
{
  "id": "some-file-uuid-value",
  "@libre.graph.motionPhoto": {
    "version": 1,
    "presentationTimestampUs": -1
  }
}
```

And yes, we require the `@libre.graph.` prefix to know which properties have been added to the MS Graph spec by us.
Okay, I'm fine with the prefix, of course. I just wasn't sure how strictly you handle it, and whether you expect any client to be truly compatible with both MS Graph and Libre Graph out of the box. I went back and forth on whether I wanted it to be its own facet or not. I'm not very emotional about it, and having shorter individual property names sounds nice. Fine with me as well.
Per review feedback on #36: move Motion Photo metadata from flat properties on the `photo` facet to a dedicated `@libre.graph.motionPhoto` facet on `driveItem`, parallel to `photo`/`audio`/`video`. Presence of the facet on a driveItem signals that it is a Motion Photo, so the separate `@libre.graph.motionPhoto` (int flag) property is dropped. Nested properties lose the `motionPhoto*` prefix since they're now scoped by the facet: `version`, `presentationTimestampUs`, `videoSize`. Consistent with how MS Graph models file/folder/photo/audio/video facets (presence = type signal) and leaves room for future Motion Photo metadata without polluting the `photo` facet's namespace.
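Sketched in OpenAPI terms, the restructured facet might look roughly like this. This is a hypothetical sketch, not the actual spec diff: the schema name, description wording, and `$ref` wiring are assumptions based on the commit message above.

```yaml
# Hypothetical sketch of the new facet schema (not the actual diff).
motionPhoto:
  type: object
  description: >
    Motion Photo metadata. Presence of this facet on a driveItem
    signals that the item is a Motion Photo.
  properties:
    version:
      type: integer
      format: int32
      readOnly: true
      description: Motion Photo format version. Maps to Camera:MotionPhotoVersion.
    presentationTimestampUs:
      type: integer
      format: int64
      readOnly: true
      description: >
        Presentation timestamp, in microseconds, of the video frame that
        corresponds to the still image; -1 means unspecified.
        Maps to Camera:MotionPhotoPresentationTimestampUs.
    videoSize:
      type: integer
      format: int64
      readOnly: true
      description: Size in bytes of the embedded video.

# On the driveItem schema, parallel to the photo/audio/video facets:
'@libre.graph.motionPhoto':
  $ref: '#/components/schemas/motionPhoto'
```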
Refactored per @butonic's suggestion — motion-photo data now lives in a dedicated `@libre.graph.motionPhoto` facet.
Description
What are Motion Photos?
Motion Photos are image files (JPEG, HEIC, or AVIF) with a short video clip (typically 1.5-3 seconds) appended to the end of the file. When you take a photo on a Google Pixel or Samsung Galaxy device, by default the camera captures a brief video around the moment of the shutter press and embeds it directly into the image file. The result is a single file that works as a normal photo everywhere, but can also play back the embedded video in apps that support it - similar to Apple's Live Photos, but as a single file rather than a separate image/video pair.
The format is specified by Google as the Motion Photo format v1.0. It uses XMP metadata in the `Camera` namespace (`http://ns.google.com/photos/1.0/camera/`) to signal that a file is a Motion Photo and to describe how the still image and video relate to each other. Samsung devices have converged on the same XMP metadata format, making this the de facto standard across the majority of Android devices.
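For reference, the relevant XMP in a Motion Photo's still image looks roughly like this (values are illustrative; the full metadata also carries a `Container:Directory` element describing the appended media items, omitted here):

```xml
<x:xmpmeta xmlns:x="adobe:ns:meta/">
  <rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#">
    <rdf:Description rdf:about=""
        xmlns:Camera="http://ns.google.com/photos/1.0/camera/"
        Camera:MotionPhoto="1"
        Camera:MotionPhotoVersion="1"
        Camera:MotionPhotoPresentationTimestampUs="1245580"/>
  </rdf:RDF>
</x:xmpmeta>
```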
What this PR adds
Three new read-only properties on the
photoschema, following the@libre.graph.*extension pattern for properties not present in the MS Graph API:@libre.graph.motionPhotointeger(int32)1if the file is a Motion Photo,0or absent if not. Maps toCamera:MotionPhotoin the spec.@libre.graph.motionPhotoVersioninteger(int32)1). Maps toCamera:MotionPhotoVersion.@libre.graph.motionPhotoPresentationTimestampUsinteger(int64)-1means unspecified. Maps toCamera:MotionPhotoPresentationTimestampUs.Example response
```json
{
  "photo": {
    "cameraMake": "Google",
    "cameraModel": "Pixel 8 Pro",
    "iso": 58,
    "takenDateTime": "2024-03-15T14:22:07Z",
    "@libre.graph.motionPhoto": 1,
    "@libre.graph.motionPhotoVersion": 1,
    "@libre.graph.motionPhotoPresentationTimestampUs": 1245580
  }
}
```

Implementation plan
The overall goal is to let the OpenCloud web frontend detect Motion Photos and play back the embedded video when a user opens one.
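The frontend's key step is locating where the appended MP4 begins inside the image file. A minimal sketch of that scanning logic, written in Go for consistency with the backend snippets in this plan (the actual frontend code would do the same over an `ArrayBuffer` in TypeScript; `findEmbeddedMP4` is a hypothetical name, not existing code):

```go
package main

import (
	"bytes"
	"fmt"
)

// findEmbeddedMP4 scans for an MP4 "ftyp" box inside a Motion Photo file.
// The embedded video is an MP4 appended after the image data; the "ftyp"
// label is preceded by the box's 4-byte big-endian size, so the video
// starts 4 bytes before the label. Returns the offset of the video start,
// or -1 if no embedded MP4 is found.
func findEmbeddedMP4(data []byte) int {
	sig := []byte("ftyp")
	off := 0
	for {
		i := bytes.Index(data[off:], sig)
		if i < 0 {
			return -1
		}
		pos := off + i
		if pos >= 4 {
			return pos - 4 // back up over the 4-byte box size
		}
		off = pos + 1
	}
}

func main() {
	jpeg := []byte{0xFF, 0xD8, 0xFF, 0xE0} // fake JPEG header bytes
	mp4 := append([]byte{0x00, 0x00, 0x00, 0x18}, []byte("ftypisom")...)
	fmt.Println(findEmbeddedMP4(append(jpeg, mp4...))) // video starts at offset 4
}
```

A production version would additionally validate the major brand after `ftyp` to avoid false positives on image payloads that happen to contain those four bytes.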
Backend (opencloud repo)
Search service / Tika extractor (`services/search/pkg/content/tika.go`):
- The `Camera:MotionPhoto`, `Camera:MotionPhotoVersion`, and `Camera:MotionPhotoPresentationTimestampUs` fields should be available in the Tika metadata map.
- Extend the `getPhoto()` function to also read these fields and set them on the `Photo` struct.
- Optionally handle Google's legacy `GCamera:MicroVideo` format (Pixel 2/3 era) and Samsung's `EmbeddedVideoType: MotionPhoto_Data` by normalizing them to `motionPhoto = 1`.
- `marshalToStringMap` needs to strip the `@libre.graph.` prefix from JSON tag names before building the storage key, so that `@libre.graph.motionPhoto` is stored as `libre.graph.photo.motionPhoto`, not `libre.graph.photo.@libre.graph.motionPhoto`.

Graph service (`services/graph/pkg/service/v0/driveitems.go`):
- `unmarshalStringMap` needs the same prefix-stripping when looking up keys in the metadata map. Once that is in place and the `libre-graph-api-go` model is regenerated, the new fields flow through automatically via the existing reflection-based marshalling.

Reva (`internal/http/services/owncloud/ocdav/propfind/propfind.go`):
- Add `motionPhoto`, `motionPhotoVersion`, and `motionPhotoPresentationTimestampUs` to the `photoKeys` slice. `appendMetadataProp` already builds storage keys from the key list, so no further changes are needed there beyond the new entries.
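A minimal Go sketch of the two helpers this plan describes — the prefix-stripping shared by the marshal/unmarshal paths, and the optional normalization of legacy Motion Photo markers. Function names are hypothetical, not the actual opencloud code:

```go
package main

import (
	"fmt"
	"strings"
)

// storageKey sketches the prefix-stripping described above: JSON tag names
// carrying the @libre.graph. extension prefix are stripped before the
// storage key is composed, so "@libre.graph.motionPhoto" under the "photo"
// facet becomes "libre.graph.photo.motionPhoto". Both marshalToStringMap
// and unmarshalStringMap would need to apply the same rule.
func storageKey(facet, jsonTag string) string {
	return "libre.graph." + facet + "." + strings.TrimPrefix(jsonTag, "@libre.graph.")
}

// isMotionPhoto sketches the optional normalization step: the modern
// Camera:MotionPhoto flag, the Pixel 2/3-era GCamera:MicroVideo marker,
// and Samsung's EmbeddedVideoType all collapse into one boolean.
func isMotionPhoto(meta map[string]string) bool {
	return meta["Camera:MotionPhoto"] == "1" ||
		meta["GCamera:MicroVideo"] == "1" ||
		meta["EmbeddedVideoType"] == "MotionPhoto_Data"
}

func main() {
	fmt.Println(storageKey("photo", "@libre.graph.motionPhoto")) // libre.graph.photo.motionPhoto
	fmt.Println(isMotionPhoto(map[string]string{"GCamera:MicroVideo": "1"}))
}
```

Plain properties without the extension prefix (e.g. `cameraMake`) pass through `TrimPrefix` unchanged, so one code path covers both standard and extension fields.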
Preview app (
`packages/web-app-preview/`):
- Detect Motion Photos via `photo['@libre.graph.motionPhoto'] === 1`.
- Locate the embedded video by scanning for the MP4 `ftyp` signature in the bytes, and play it in a `<video>` element.

Out of scope for now
- Apple Live Photos: separate image/video pairs linked via a shared `ContentIdentifier`. Supporting them requires a fundamentally different approach (file relationship management) and is tracked separately.

References
I'm willing to do the implementation.